
[WIP] Revert Py2 #8

Open
wants to merge 1 commit into base: databricks
Conversation

@lihaoyi-databricks commented Oct 10, 2023

Prospective revert of #5. That PR never made it into the version of rules_docker used in databricks/runtime, and seems to cause a test failure when I included it:

$ bazel run //connector/spark-jenkins/docker:spark-jenkins-it-image_finished_setup_new_rules_test 
Using Bazel Cache at grpcs://bazel-remote-grpc.private.staging.cloud.databricks.com
Using Scala Incremental Compilation
(02:28:58) INFO: Invocation ID: 6e77b23e-1cc1-4f69-a81e-4b5066b07dd4
(02:28:58) INFO: Current date is 2023-10-10
(02:28:59) INFO: Analyzed target //connector/spark-jenkins/docker:spark-jenkins-it-image_finished_setup_new_rules_test (60 packages loaded, 1122 targets configured).
(02:28:59) INFO: Found 1 target...
(02:29:01) WARNING: DownloadArtifacts multiplex-worker (id 2) can no longer be used, because its files have changed on disk:
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/constants/constants-jetty9-hadoop1_2.12_deploy.jar: 337830bbdf7f65139f3e479becd05694200b034f824ca6d51a8b6b5882181e24 -> b1d4529635350b7f473f53d91a9f671f7683bf10ca4872fe7b858d7d183f2b7d
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/downloader/downloader-jetty9-hadoop1_2.12_deploy.jar: 6031627a8173454463fea5bc52d088e142c1910e5dea3e466b28dce21b710092 -> 31f0ddd946b698a108108f0c98a97fe7340e99eb666fd8dc5bbadd1a3245f700
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/hashing/hashing-jetty9-hadoop1_2.12_deploy.jar: 4dd61d8822e99d18e2c898d75275a83b3840297cd61ccae8f7952b790f8dd109 -> d1dc0a144ab7a8bf71aeeccb877a3d0c50a3a5d95af7176916d43a62bd3c535f
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/proxying/proxying-jetty9-hadoop1_2.12_deploy.jar: 1a551fba0e25acfe5304a8c0e5f636232ed99dd6df931ccedb842df1a1452f71 -> 4d5bc370292ded3d59fbfc92a8e88a100b4669784180b6b1ab3b565dd3948797
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/worker/lib-jetty9-hadoop1_2.12_deploy.jar: 33bd4b273cd8d2d0459515ea11c409829881fdf38786bf2264db9796a2eae022 -> 0dcbbdb44cdc904808981c8344b10cf98fe9c2310b4d4eb7f165f48e8cbd6b71
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/artifacts/worker/worker_deploy.jar: cc53da3852b42f4b1567d49ace6ea5369d5d69503da883e6da2911c9aa0419a3 -> f366b5c5d364d6eae6f8cdd91683f9c2b277d61737e71c2e8ec36e41182cdca9
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/worker/bazel-worker-base-jetty9-hadoop1_2.12_deploy.jar: 14827d1924d70316ec21421e42ee98a4ffce394242a7640ef9ce18b93b32fc85 -> 220bddf425aa45057c7f593892a5d4f06d1b757f0055af88d0bd94736c37a51d
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/runtime/external/universe/bazel/worker/log4j_config-jetty9-hadoop1_2.12_deploy.jar: 850f7622db0b3ab11af27010e365768756e7198969b81ee4964866ec4da73884 -> 35932d031b6380a8e22d3bee8e02356859fec9c1b95efecd259e459263737842
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/constants/constants-jetty9-hadoop1_2.12_deploy.jar: 337830bbdf7f65139f3e479becd05694200b034f824ca6d51a8b6b5882181e24 -> b1d4529635350b7f473f53d91a9f671f7683bf10ca4872fe7b858d7d183f2b7d
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/downloader/downloader-jetty9-hadoop1_2.12_deploy.jar: 6031627a8173454463fea5bc52d088e142c1910e5dea3e466b28dce21b710092 -> 31f0ddd946b698a108108f0c98a97fe7340e99eb666fd8dc5bbadd1a3245f700
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/hashing/hashing-jetty9-hadoop1_2.12_deploy.jar: 4dd61d8822e99d18e2c898d75275a83b3840297cd61ccae8f7952b790f8dd109 -> d1dc0a144ab7a8bf71aeeccb877a3d0c50a3a5d95af7176916d43a62bd3c535f
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/proxying/proxying-jetty9-hadoop1_2.12_deploy.jar: 1a551fba0e25acfe5304a8c0e5f636232ed99dd6df931ccedb842df1a1452f71 -> 4d5bc370292ded3d59fbfc92a8e88a100b4669784180b6b1ab3b565dd3948797
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/worker/lib-jetty9-hadoop1_2.12_deploy.jar: 33bd4b273cd8d2d0459515ea11c409829881fdf38786bf2264db9796a2eae022 -> 0dcbbdb44cdc904808981c8344b10cf98fe9c2310b4d4eb7f165f48e8cbd6b71
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/artifacts/worker/worker_deploy.jar: cc53da3852b42f4b1567d49ace6ea5369d5d69503da883e6da2911c9aa0419a3 -> f366b5c5d364d6eae6f8cdd91683f9c2b277d61737e71c2e8ec36e41182cdca9
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/worker/bazel-worker-base-jetty9-hadoop1_2.12_deploy.jar: 14827d1924d70316ec21421e42ee98a4ffce394242a7640ef9ce18b93b32fc85 -> 220bddf425aa45057c7f593892a5d4f06d1b757f0055af88d0bd94736c37a51d
bazel-out/host/bin/external/universe/bazel/artifacts/download.runfiles/universe/bazel/worker/log4j_config-jetty9-hadoop1_2.12_deploy.jar: 850f7622db0b3ab11af27010e365768756e7198969b81ee4964866ec4da73884 -> 35932d031b6380a8e22d3bee8e02356859fec9c1b95efecd259e459263737842
(02:29:01) INFO: Destroying DownloadArtifacts multiplex-worker (id 2)
(02:29:01) INFO: Multiplexer process for DownloadArtifacts has closed its output stream
(02:29:01) INFO: Created new non-sandboxed DownloadArtifacts multiplex-worker (id 3), logging to /ephemeral/home/haoyi.li/.cache/bazel/_bazel_haoyi.li/529b17f0d7eccfb75b7cc34a49a29224/bazel-workers/multiplex-worker-3-DownloadArtifacts.log
Target //connector/spark-jenkins/docker:spark-jenkins-it-image_finished_setup_new_rules_test up-to-date:
  bazel-bin/connector/spark-jenkins/docker/spark-jenkins-it-image_finished_setup_new_rules_test
(02:29:07) INFO: Elapsed time: 9.163s, Critical Path: 7.95s
(02:29:07) INFO: 343 processes: 277 remote cache hit, 44 internal, 21 linux-sandbox, 1 worker.
(02:29:07) INFO: Running command line: external/bazel_tools/tools/test/test-setup.sh connector/spark-jenkins/docker/spark-jenkins-it-image_finished_setup_new_rules_test
(02:29:07) INFO: Build Event Protocol files produced successfully.
(02:29:07) INFO: Build completed successfully, 343 total actions
exec ${PAGER:-/usr/bin/less} "$0" || exit 1
Executing tests from //connector/spark-jenkins/docker:spark-jenkins-it-image_finished_setup_new_rules_test
-----------------------------------------------------------------------------
IDs in config file do not match generated IDs
Config IDs:
	sha256:88cc1a200eb9be206c4261e79d642c392d97980236a066f23434f4452f8212a7
	sha256:bf509d6bc5ecd3b1a3660fb5f167ce2e320f5cc532574217cd8ba179ca06bae9
	sha256:2af0e1f1e531af9c95e30c3f908f6431d635a173b866e39b5f480e9ff3a180b9
	sha256:4b07a7bca5b71a13357fc3f96dab8dd67a2ffb139410cd2fcd74a61fa8225255
	sha256:ec2571a0ee18471d728d3a67d909afcc922d8b0e49a3ab55c46663acba9a3808
	sha256:258dda35c1008b877e1bd1ca1a6e3dd575fb75f54d61765388d72d5893bc7ca3
	sha256:be28fd825c0f4d3e4f43019506f2541622522aeeb992d651aa310000265cf129
	sha256:7b384e9e4ed40ec506d8aa34d8c59442ad3910538feb8b50034c15be28776e70
	sha256:5e80a86007e5587eb359ebe2ab02b0034b63490321d7142af796626b2f537e77
	sha256:86c63e99c2265bc95e6c366d5d99e9f16674b0cc982293c57c300b29ead0fe15
	sha256:cd223edb84931690363255c14c8874ab047dafd096814cce98cf8f4ce3d563fe
	sha256:2d3e900e23fe83e7c8fc96d104a668e2e3da205cb70a474e3954aaf4270783ba
	sha256:fb5d885df3ff05e4c45ede3d8609eea12aa9b685b8db13236f68ee2c5b0d6973
	sha256:18305fd2ecf86f18f3db068c4816e06e2dbe04c4ccff50a6545b0ce37abae72c
	sha256:ffc02492ca45a56186cfff17a2be8a8a19c4f0c718de11d13ed28fd51d9ce4fe
	sha256:4f372f7ef8230b7638be23732368c0ab29a2f666167cc78192305a1178d61e05
Generated IDs:
	sha256:88cc1a200eb9be206c4261e79d642c392d97980236a066f23434f4452f8212a7
	sha256:bf509d6bc5ecd3b1a3660fb5f167ce2e320f5cc532574217cd8ba179ca06bae9
	sha256:2af0e1f1e531af9c95e30c3f908f6431d635a173b866e39b5f480e9ff3a180b9
	sha256:4b07a7bca5b71a13357fc3f96dab8dd67a2ffb139410cd2fcd74a61fa8225255
	sha256:ec2571a0ee18471d728d3a67d909afcc922d8b0e49a3ab55c46663acba9a3808
	sha256:258dda35c1008b877e1bd1ca1a6e3dd575fb75f54d61765388d72d5893bc7ca3
	sha256:be28fd825c0f4d3e4f43019506f2541622522aeeb992d651aa310000265cf129
	sha256:7b384e9e4ed40ec506d8aa34d8c59442ad3910538feb8b50034c15be28776e70
	sha256:5e80a86007e5587eb359ebe2ab02b0034b63490321d7142af796626b2f537e77
	sha256:86c63e99c2265bc95e6c366d5d99e9f16674b0cc982293c57c300b29ead0fe15
	sha256:cd223edb84931690363255c14c8874ab047dafd096814cce98cf8f4ce3d563fe
	sha256:2d3e900e23fe83e7c8fc96d104a668e2e3da205cb70a474e3954aaf4270783ba
	sha256:fb5d885df3ff05e4c45ede3d8609eea12aa9b685b8db13236f68ee2c5b0d6973
	sha256:18305fd2ecf86f18f3db068c4816e06e2dbe04c4ccff50a6545b0ce37abae72c
	sha256:6c117b0acfe9e1d8c1646cc9f11014816df12da2dc11636891ae810d65ab016b
	sha256:4f372f7ef8230b7638be23732368c0ab29a2f666167cc78192305a1178d61e05
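
The two lists differ only in the second-to-last entry (config `sha256:ffc02492...` vs generated `sha256:6c117b0a...`); every other diff ID matches. For concreteness, here is a minimal, hypothetical sketch of the kind of comparison the test is making. The file names and JSON layout (`rootfs.diff_ids` from an OCI/Docker image config) are assumptions for illustration, not the actual test harness in rules_docker:

```python
# Hypothetical helper: pinpoint which layer diff ID disagrees between the
# image config and the freshly generated layer digests.
import json


def load_diff_ids(config_path):
    # Diff IDs of an OCI/Docker image live under rootfs.diff_ids.
    with open(config_path) as f:
        return json.load(f)["rootfs"]["diff_ids"]


def compare_ids(config_ids, generated_ids):
    # Report the first position where the two lists diverge.
    for i, (c, g) in enumerate(zip(config_ids, generated_ids)):
        if c != g:
            print(f"mismatch at layer {i}:\n  config:    {c}\n  generated: {g}")
            return
    if len(config_ids) != len(generated_ids):
        print(f"length mismatch: {len(config_ids)} vs {len(generated_ids)}")
    else:
        print("all diff IDs match")


if __name__ == "__main__":
    # File names are placeholders, not paths used by the real test.
    compare_ids(
        load_diff_ids("config.json"),
        load_diff_ids("generated_config.json"),
    )
```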

For now, this PR preserves the status quo by reverting that change from the branch used in databricks/runtime, while still picking up the more recent changes.
